GADGET SVM: a Gossip-bAseD sub-GradiEnT SVM solver
Authors
Abstract
Distributed environments such as federated databases, wireless and sensor networks, and Peer-to-Peer (P2P) networks are becoming increasingly popular and are well suited for machine learning, since they can store large quantities of data across a network. The distributed setting is complex, in part because network topologies are often dynamic and the data available to algorithms changes frequently. Furthermore, in many distributed scenarios (such as sensor networks) nodes may have limited resources. Distributed Data Mining (DDM; (Kargupta & Chan, 2000), (Demers et al., 2002), (Guo & (editors), 1999), (Provost, 2000)) and machine learning algorithms designed for these settings must have high utility, incur low communication cost, work on dynamic networks, and be computationally efficient.
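The abstract's title names a gossip-based solver. As a minimal sketch of the gossip primitive such solvers build on (an illustrative pairwise-averaging toy, not the paper's actual GADGET protocol), nodes can repeatedly average their local model vectors with a random neighbor so that all nodes converge toward the network-wide mean:

```python
import numpy as np

def gossip_average(models, rounds=300, seed=0):
    """Toy pairwise gossip averaging (illustrative sketch only).

    models: list of local parameter vectors, one per node.
    Each round, a random pair of nodes replaces both of their
    vectors with the pair's average. The global mean is preserved
    at every step, and all nodes converge to it.
    """
    rng = np.random.default_rng(seed)
    w = [m.astype(float).copy() for m in models]
    n = len(w)
    for _ in range(rounds):
        # pick two distinct nodes uniformly at random
        i, j = rng.choice(n, size=2, replace=False)
        avg = (w[i] + w[j]) / 2.0
        w[i], w[j] = avg, avg.copy()
    return w
```

Because each exchange preserves the sum of the two vectors, the network-wide average is invariant, which is why gossip protocols of this family are attractive on dynamic topologies: no coordinator is needed and any connected communication pattern eventually mixes.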
Similar References
Pegasos: Primal Estimated sub-GrAdient SOlver for SVM
We describe and analyze a simple and effective stochastic sub-gradient descent algorithm for solving the optimization problem cast by Support Vector Machines (SVM). We prove that the number of iterations required to obtain a solution of accuracy ε is Õ(1/ε), where each iteration operates on a single training example. In contrast, previous analyses of stochastic gradient descent methods for SVMs r...
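The sub-gradient step this snippet describes can be sketched as follows. This is a minimal illustrative implementation of the Pegasos-style update (one sampled example per iteration, step size 1/(λt), with the optional projection onto the ball of radius 1/√λ), not a faithful reproduction of the paper's full algorithm:

```python
import numpy as np

def pegasos(X, y, lam=0.1, T=500, seed=0):
    """Sketch of the Pegasos stochastic sub-gradient SVM update.

    X: (n, d) feature matrix; y: labels in {-1, +1}.
    Each iteration samples one training example and takes a
    sub-gradient step on the regularized hinge loss
    (lam/2)*||w||^2 + max(0, 1 - y_i <w, x_i>).
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for t in range(1, T + 1):
        i = rng.integers(n)
        eta = 1.0 / (lam * t)          # step size 1/(lam * t)
        margin = y[i] * (X[i] @ w)
        # sub-gradient step: shrink w, then add the hinge term
        # only if example i violates the margin
        w *= (1.0 - eta * lam)
        if margin < 1:
            w += eta * y[i] * X[i]
        # optional projection onto the ball of radius 1/sqrt(lam)
        norm = np.linalg.norm(w)
        radius = 1.0 / np.sqrt(lam)
        if norm > radius:
            w *= radius / norm
    return w
```

Each iteration touches a single example, which is exactly why the iteration bound Õ(1/ε) is independent of the training-set size.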
F-SVM: Combination of Feature Transformation and SVM Learning via Convex Relaxation
The generalization error bound of support vector machine (SVM) depends on the ratio of radius and margin, while standard SVM only considers the maximization of the margin but ignores the minimization of the radius. Several approaches have been proposed to integrate radius and margin for joint learning of feature transformation and SVM classifier. However, most of them either require the form of...
Symbolic computing of LS-SVM based models
This paper introduces a software tool SYM-LS-SVM-SOLVER written in Maple to derive the dual system and the dual model representation of LS-SVM based models, symbolically. SYM-LS-SVM-SOLVER constructs the Lagrangian from the given objective function and list of constraints. Afterwards it obtains the KKT (Karush-Kuhn-Tucker) optimality conditions and finally formulates a linear system in terms of...
Learning geometric combinations of Gaussian kernels with alternating Quasi-Newton algorithm
We propose a novel algorithm for learning a geometric combination of Gaussian kernels jointly with an SVM classifier. This problem is the product counterpart of MKL, with a restriction to Gaussian kernels. Our algorithm finds a local solution by alternating a Quasi-Newton gradient descent over the kernels and a classical SVM solver over the instances. We show promising results on well known data se...
Dual Solver for the Multiplicative Kernel Structural SVM
This manuscript describes the implementation of a mini-batch dual solver for the multiplicative kernel structural SVM used in [1]. The solver is written from scratch (except for wrapping around a QP solver), and uses dual coordinate ascent style updates, similar to SMO [2, 3, 4], SDCA [5], and D. Ramanan’s linear structural SVM solver [6]. The use of a mini-batch update strategy resulted in a 1...
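The dual coordinate ascent updates mentioned here (SMO/SDCA-style) can be illustrated on the simplest case, a linear L1-loss SVM. The sketch below is a hypothetical simplification for illustration, in the spirit of those solvers rather than the manuscript's structural, multiplicative-kernel implementation; it updates one dual variable at a time with a closed-form clipped step while maintaining w = Σ αᵢ yᵢ xᵢ:

```python
import numpy as np

def dual_cd_svm(X, y, C=1.0, epochs=50, seed=0):
    """Sketch of dual coordinate ascent for a linear L1-loss SVM.

    X: (n, d) feature matrix; y: labels in {-1, +1}.
    Maintains the primal vector w = sum_i alpha_i * y_i * x_i and
    takes a closed-form step on one dual coordinate at a time,
    clipped to the box [0, C].
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    alpha = np.zeros(n)
    w = np.zeros(d)
    qii = np.einsum("ij,ij->i", X, X)  # per-example squared norms
    for _ in range(epochs):
        for i in rng.permutation(n):
            if qii[i] == 0:
                continue
            g = y[i] * (X[i] @ w) - 1.0          # coordinate gradient
            new_alpha = np.clip(alpha[i] - g / qii[i], 0.0, C)
            # incrementally keep w consistent with the new alpha_i
            w += (new_alpha - alpha[i]) * y[i] * X[i]
            alpha[i] = new_alpha
    return w, alpha
```

A mini-batch variant, as the snippet describes, would apply such updates to a small block of dual variables per step instead of a single coordinate.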